Concentration of information content for convex measures
Authors
Abstract
Similar resources
Egoroff Theorem for Operator-Valued Measures in Locally Convex Cones
In this paper, we define almost uniform convergence and almost everywhere convergence for cone-valued functions with respect to an operator-valued measure. We prove the Egoroff theorem for P-valued functions and an operator-valued measure θ : R → L(P, Q), where R is a σ-ring of subsets of X ≠ ∅, (P, V) is a quasi-full locally convex cone and (Q, W) is a locally ...
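For orientation, the classical scalar Egoroff theorem that this result generalizes can be stated as follows (a standard formulation, not the paper's cone-valued version): if \((X, \mathcal{A}, \mu)\) is a measure space with \(\mu(X) < \infty\) and \(f_n \to f\) \(\mu\)-almost everywhere, then

\[
\forall \varepsilon > 0 \;\; \exists E \in \mathcal{A}: \quad \mu(E) < \varepsilon \quad \text{and} \quad f_n \to f \ \text{uniformly on } X \setminus E.
\]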
Dynamic Bayesian Information Measures
This paper introduces measures of information for Bayesian analysis when the support of the data distribution is truncated progressively. The focus is on lifetime distributions where the support is truncated at the current age t ≥ 0. Notions of uncertainty and information are presented and operationalized by Shannon entropy, Kullback-Leibler information, and mutual information. Dynamic updatings...
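As a sketch of the quantities involved, under the standard residual-life truncation (an assumption about the notation, since the abstract is cut off): for a lifetime density \(f\) with survival function \(S(t) = P(X > t)\), the density truncated at age \(t\) is \(f(x)/S(t)\) for \(x \ge t\), and the dynamic (residual) Shannon entropy is

\[
H(f; t) = -\int_t^\infty \frac{f(x)}{S(t)} \log \frac{f(x)}{S(t)} \, dx,
\]

with the dynamic Kullback-Leibler information defined analogously between two residual densities.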
Optimal Concentration of Information Content For Log-Concave Densities
An elementary proof is provided of sharp bounds for the varentropy of random vectors with log-concave densities, as well as for deviations of the information content from its mean. These bounds significantly improve on the bounds obtained by Bobkov and Madiman (Ann. Probab., 39(4):1528–1543, 2011). Mathematics Subject Classification (2010). Primary 52A40; Secondary 60E15, 94A17.
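For context, the information content and varentropy in question are, in common notation (stated here as a reader's aid, not a quotation from the paper): for a random vector \(X\) with log-concave density \(f\) on \(\mathbb{R}^n\),

\[
\tilde h(X) = -\log f(X), \qquad V(X) = \operatorname{Var}\big(\tilde h(X)\big) \le n,
\]

so the fluctuations of the information content around its mean, the entropy \(h(X) = \mathbb{E}\,\tilde h(X)\), are at most of order \(\sqrt{n}\).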
Information-theoretic measures of predictability for music content analysis
This thesis is concerned with determining similarity in musical audio, for the purpose of applications in music content analysis. To determine similarity, we consider the problem of representing temporal structure in music, and to represent temporal structure we propose to compute information-theoretic measures of predictability in sequences. We apply our measures to track-wise repr...
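One standard predictability measure of this kind (given as an illustration; the thesis's own measures are not reproduced in this snippet) is the entropy rate of a stationary symbol sequence \((X_n)\),

\[
h = \lim_{n \to \infty} H(X_n \mid X_{n-1}, \dots, X_1),
\]

which is low for highly repetitive, predictable material and high for unpredictable material.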
Information Measures via Copula Functions
In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among these, in this paper we examine measures such as Kullback-Leibler information, J-divergence, Hellinger distance, -divergence, and so on. Properties and results related to distance between probability d...
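The first three measures listed are standard; for densities \(f\) and \(g\) they read (classical definitions, not the paper's copula-based variants):

\[
\mathrm{KL}(f \| g) = \int f \log \frac{f}{g}, \qquad J(f, g) = \mathrm{KL}(f \| g) + \mathrm{KL}(g \| f), \qquad H^2(f, g) = \frac{1}{2} \int \big(\sqrt{f} - \sqrt{g}\big)^2.
\]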
Journal
Journal title: Electronic Journal of Probability
Year: 2020
ISSN: 1083-6489
DOI: 10.1214/20-ejp416